On Boosting Sparse Parities
Abstract
While boosting has been extensively studied, considerably less attention has been devoted to designing good weak learning algorithms. In this paper we consider the problem of designing weak learners that are especially well suited to boosting, and specifically to the AdaBoost algorithm. We first describe the properties desirable in a weak learning algorithm. We then propose using sparse parity functions, which have many of these properties, as weak learners in boosting. Our experiments show the proposed weak learners to be competitive with the most widely used ones: decision stumps and pruned decision trees.
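As a concrete (but hedged) illustration of the proposal, the sketch below plugs an exhaustive search over k-sparse parities into standard AdaBoost. It is a minimal sketch, not the paper's exact procedure; the function names, the sparsity parameter k, and the {0,1} feature encoding are all assumptions made for the example.

```python
import numpy as np
from itertools import combinations

def best_sparse_parity(X, y, w, k=2):
    """Weak learner: exhaustively search all parities on at most k of the
    d binary features and return the one minimizing weighted error.
    X: (m, d) in {0,1}; y: (m,) in {-1,+1}; w: nonnegative weights summing to 1."""
    m, d = X.shape
    best = (np.inf, None, 1)  # (weighted error, variable subset, sign)
    for r in range(1, k + 1):
        for S in combinations(range(d), r):
            h = 1 - 2 * (X[:, list(S)].sum(axis=1) % 2)  # parity as +/-1
            err = w[h != y].sum()
            for sign, e in ((1, err), (-1, 1.0 - err)):  # parity or its negation
                if e < best[0]:
                    best = (e, S, sign)
    return best

def adaboost_parities(X, y, T=50, k=2):
    """Plain AdaBoost with k-sparse parities as the weak hypothesis class."""
    m = len(y)
    w = np.full(m, 1.0 / m)
    ensemble = []
    for _ in range(T):
        err, S, sign = best_sparse_parity(X, y, w, k)
        err = np.clip(err, 1e-12, 1 - 1e-12)      # guard against a perfect parity
        alpha = 0.5 * np.log((1 - err) / err)
        h = sign * (1 - 2 * (X[:, list(S)].sum(axis=1) % 2))
        w *= np.exp(-alpha * y * h)               # standard AdaBoost reweighting
        w /= w.sum()
        ensemble.append((alpha, S, sign))
    return ensemble

def predict(ensemble, X):
    score = sum(a * s * (1 - 2 * (X[:, list(S)].sum(axis=1) % 2))
                for a, S, s in ensemble)
    return np.sign(score)
```

The exhaustive search examines O(d^k) candidate parities per boosting round, which is why only small sparsity levels k are practical.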
Related Papers
On Noise-Tolerant Learning of Sparse Parities and Related Problems
We consider the problem of learning sparse parities in the presence of noise. For learning parities on r out of n variables, we give an algorithm that runs in time poly(log(1/δ), 1/(1−2η)) · n^((1+(2η)²+o(1)) r/2) and uses only r·log(n/δ)·ω(1)/(1−2η)² samples in the random noise setting under the uniform distribution, where η is the noise rate and δ is the confidence parameter. From previously known result...
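For orientation, the naive baseline that such results improve upon simply tests every size-r support. The sketch below is that trivial n^r-time method, not the paper's algorithm; the function name and the {0,1} encodings are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

def learn_noisy_parity_bruteforce(X, y, r):
    """Naive n^r baseline for learning an r-sparse parity with noise:
    for each size-r subset S, measure the empirical agreement of the
    parity chi_S with the noisy labels and return the subset whose
    agreement is furthest from 1/2 (the planted parity agrees with the
    labels at rate 1 - eta; irrelevant parities agree at rate ~1/2).
    X: (m, n) in {0,1}; y: (m,) in {0,1}."""
    m, n = X.shape
    best_S, best_bias = None, -1.0
    for S in combinations(range(n), r):
        agree = np.mean((X[:, list(S)].sum(axis=1) % 2) == y)
        bias = abs(agree - 0.5)
        if bias > best_bias:
            best_S, best_bias = S, bias
    return best_S
```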
A Boosting Framework on Grounds of Online Learning
By exploiting the duality between boosting and online learning, we present a boosting framework that proves extremely powerful by drawing on the vast body of knowledge in the online learning area. Using this framework, we develop algorithms that address several practically and theoretically interesting questions, including sparse boosting, smooth-distribution boosting, agno...
Generalized Dictionary for Multitask Learning with Boosting
While multitask learning has been extensively studied, most existing methods rely on linear models (e.g. linear regression, logistic regression), which may fail in dealing with more general (nonlinear) problems. In this paper, we present a new approach that combines dictionary learning with gradient boosting to achieve multitask learning with general (nonlinear) basis functions. Specifically, f...
Accelerated Gradient Boosting
Gradient tree boosting is a prediction algorithm that sequentially produces a model in the form of linear combinations of decision trees, by solving an infinite-dimensional optimization problem. We combine gradient boosting and Nesterov’s accelerated descent to design a new algorithm, which we call AGB (for Accelerated Gradient Boosting). Substantial numerical evidence is provided on both synth...
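A minimal sketch of the idea for squared loss, assuming scikit-learn regression trees as the base learners: each tree is fitted at a Nesterov-style lookahead point rather than at the current ensemble prediction. The momentum schedule below is the standard FISTA one; AGB's exact schedule and loss handling may differ, and all names here are illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def agb_fit(X, y, T=100, lr=0.1, max_depth=3):
    """Gradient boosting with a Nesterov-style lookahead sequence g:
    residuals are taken at the momentum point g, not at the ensemble f."""
    f = np.full(len(y), y.mean())   # main sequence of ensemble predictions
    g = f.copy()                    # lookahead (momentum) sequence
    trees, gammas, lam = [], [], 1.0
    for _ in range(T):
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, y - g)
        lam_next = (1 + np.sqrt(1 + 4 * lam**2)) / 2
        gamma = (lam - 1) / lam_next        # momentum coefficient (0 on step 1)
        f_new = g + lr * tree.predict(X)
        g = f_new + gamma * (f_new - f)     # extrapolate past the new ensemble
        f, lam = f_new, lam_next
        trees.append(tree)
        gammas.append(gamma)
    return y.mean(), trees, gammas

def agb_predict(X, base, trees, gammas, lr=0.1):
    """Replay the same two-sequence recurrence on new inputs."""
    f = np.full(len(X), base)
    g = f.copy()
    for tree, gamma in zip(trees, gammas):
        f_new = g + lr * tree.predict(X)
        g = f_new + gamma * (f_new - f)
        f = f_new
    return f
```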
Finding Correlations in Subquadratic Time, with Applications to Learning Parities and Juntas with Noise (Preliminary Version)
Given a set of n random d-dimensional boolean vectors with the promise that two of them are ρ-correlated with each other, how quickly can one find the two correlated vectors? We present a surprising and simple algorithm which, for any constant ε > 0, runs in (expected) time d·n^(3ω/4 + ε)·poly(1/ρ) < d·n^1.8·poly(1/ρ), where ω < 2.4 is the exponent of matrix multiplication. This is the first subquadrati...
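The speedup can be illustrated with the vector-aggregation idea at the heart of such algorithms: replace the n²-pair brute force with one Gram-matrix computation over group sums. The sketch below is a simplified illustration, not the full algorithm (which also amplifies the correlation before aggregating); the group size g and all names are assumptions.

```python
import numpy as np

def find_correlated_pair(V, g=64):
    """V: (n, d) matrix with +/-1 entries; two rows are rho-correlated,
    the rest independent and uniform. Aggregation trick: sum the rows in
    random groups of size g, compute all group-group inner products with
    ONE matrix multiplication, then brute-force only inside the pair of
    groups whose sum-vectors have an anomalously large inner product."""
    n, d = V.shape
    assert n % g == 0
    groups = np.random.permutation(n).reshape(n // g, g)
    S = V[groups].sum(axis=1)              # (n/g, d) group-sum vectors
    G = np.abs(S @ S.T).astype(float)      # one (n/g)^2 Gram matrix, not n^2
    np.fill_diagonal(G, -np.inf)           # assumes the pair straddles two
    i, j = np.unravel_index(G.argmax(), G.shape)  # groups; rerun with a fresh
    cand = np.concatenate([groups[i], groups[j]]) # permutation otherwise
    C = np.abs(V[cand] @ V[cand].T).astype(float)
    np.fill_diagonal(C, -np.inf)
    a, b = np.unravel_index(C.argmax(), C.shape)
    return cand[a], cand[b]                # indices of the suspected pair
```

Grouping trades the n²·d brute force for a single (n/g)×(n/g) Gram computation (amenable to fast matrix multiplication) plus a small in-group search; the paper's full algorithm additionally embeds the vectors to boost ρ before aggregating.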
Published: 2014